Fix for Error Occurring When "use_focal_loss=False" in Postprocessor #429
base: main
Conversation
You can not only use If you want it to work, you should add If you want to conduct ablation experiments, I suggest configuring these parameters separately in each module.
@lyuwenyu

```python
if use_sigmoid:
    scores = torch.sigmoid(out_logits)
    ...
else:
    scores = torch.nn.functional.softmax(out_logits, dim=-1)
    ...
```

This way, we preserve the original behavior while adding the option to use
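The suggested branch can be exercised end-to-end with a dependency-free sketch (plain Python standing in for PyTorch; the `postprocess_scores` helper and the sample logits are illustrative, not from the repository). It shows the behavioral difference: sigmoid scores each class independently, while softmax normalizes across classes so the scores sum to 1.

```python
import math

def postprocess_scores(logits, use_sigmoid):
    # logits: raw per-class outputs for a single query (illustrative helper).
    if use_sigmoid:
        # Sigmoid: each class is scored independently, as with focal loss.
        return [1.0 / (1.0 + math.exp(-x)) for x in logits]
    # Softmax: scores are normalized across classes and sum to 1.
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 0.5, -1.0]
sig = postprocess_scores(logits, use_sigmoid=True)
soft = postprocess_scores(logits, use_sigmoid=False)
print(round(sum(soft), 6))  # softmax scores sum to 1.0
```

Note that the two branches are not interchangeable numerically: sigmoid scores can all be high at once, whereas softmax forces a competition among classes, which is why the postprocessor must match the loss used at training time.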
It looks good to rename
@lyuwenyu

```python
if use_sigmoid:
    scores = torch.sigmoid(out_logits)
    ...
else:
    scores = torch.nn.functional.softmax(out_logits, dim=-1)
    ...
```
Yes, but it may have
For the sake of code consistency and readability, I believe it would be beneficial to maintain the
But
Thank you for your kind explanation and clarification. I appreciate you informing me about the original intention of covering cases No.0 and No.2. This insight is very helpful. However, I noticed that in your current code, there doesn't seem to be any implementation for adding a

From my analysis, the current code structure seems to align more closely with cases No.0 and No.1. Could you please provide some clarification on how you envision handling case No.2 within the current framework? This would help ensure that our implementation accurately reflects the intended functionality of the model.

Thank you again for your patience and guidance throughout this process. I look forward to your thoughts on this matter.
@lyuwenyu
If you modify the name
As you said, the exact meaning is not expressed there, but you can explain it through comments in the code.
@lyuwenyu, thank you for your reply! I agree that making significant changes to the config isn't ideal, as you pointed out.
As you mentioned earlier, there's been extensive debate about whether a background class (or void class) is necessary. We haven't reached a conclusion yet, and we're eagerly awaiting your input on this matter. Your insights would be invaluable in resolving this issue and improving the model's implementation across different platforms. We appreciate your time and expertise in guiding us through this process.
Modification

```python
scores = F.softmax(logits)[:, :, :-1]
```

->

```python
scores = F.softmax(logits, dim=-1)
```

Example

- When `use_focal_loss=True`: results
- When `use_focal_loss=False`, BEFORE THE FIX (`scores = F.softmax(logits)[:, :, :-1]`): results
- When `use_focal_loss=False`, AFTER THE FIX (`scores = F.softmax(logits, dim=-1)`): results
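The one-line change can be sanity-checked without PyTorch. In the sketch below, plain nested lists stand in for the `[batch, query, class]` logits tensor; the shapes and values are illustrative. The point is that the old `[:, :, :-1]` slice assumes a trailing background channel to discard, and otherwise silently drops the last real class, while the fixed version keeps every class and normalizes over the class dimension. (Calling `F.softmax` without an explicit `dim` also triggers a deprecation warning in PyTorch, which the fix avoids.)

```python
import math

def softmax(row):
    # Numerically stable softmax over one class-score row.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

# One batch, one query, 3 classes (no background channel in this sketch).
logits = [[[0.2, 1.5, 3.0]]]

# Before the fix: softmax, then drop the last channel --
# here that discards the highest-scoring class.
before = [[softmax(q)[:-1] for q in img] for img in logits]

# After the fix: softmax over the class dimension, all classes kept.
after = [[softmax(q) for q in img] for img in logits]

print(len(before[0][0]), len(after[0][0]))  # 2 3
```

Dropping the slice is therefore safe only when the classification head has no extra background class; when it does have one, the slice (or an equivalent `num_classes` adjustment) is still needed.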